Avoiding Class-conditional Independence Assumptions in Image Classification
Author

Abstract
Much published work on contextual image classification is based on an assumption of class-conditional independence (CCI) of the measurement data, which is equivalent to assuming that an ideal classifier will recover the underlying true scene with errors distributed evenly and randomly, like white noise. This paper proposes a simple alternative model which, it is argued, is more realistic in many applications and upon which useful theory can still be built. The new model is then used to investigate the effect on the accuracy of an object classifier that makes the CCI assumption in a domain where it does not hold.
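To make the contrast concrete, the following sketch (a hypothetical illustration, not the paper's model) compares labelling errors scattered independently per pixel, as the CCI / white-noise picture implies, with errors that arrive in spatially coherent blocks; the image size, error rate, block size, and clustering statistic are all assumed for illustration.

```python
# Minimal sketch (not from the paper): contrasting CCI-style white-noise errors
# with a hypothetical spatially correlated error pattern. All parameter values
# and the block construction are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
H, W, error_rate = 64, 64, 0.1

# A toy "true scene" of two classes: left half class 0, right half class 1.
true_scene = np.zeros((H, W), dtype=int)
true_scene[:, W // 2:] = 1

# CCI-style labelling errors: each pixel is flipped independently,
# so errors are scattered like white noise.
iid_flips = rng.random((H, W)) < error_rate
cci_labels = np.where(iid_flips, 1 - true_scene, true_scene)

# Spatially correlated errors: whole 8x8 blocks (think misclassified objects)
# are flipped together, so a similar overall error rate clusters in space.
corr_labels = true_scene.copy()
n_blocks = int(error_rate * H * W / (8 * 8))
for _ in range(n_blocks):
    r, c = rng.integers(0, H - 8), rng.integers(0, W - 8)
    corr_labels[r:r + 8, c:c + 8] = 1 - corr_labels[r:r + 8, c:c + 8]

for name, labels in [("iid (CCI)", cci_labels), ("correlated", corr_labels)]:
    err = labels != true_scene
    # Fraction of erroneous pixels whose right-hand neighbour is also wrong:
    # strong spatial clustering of errors is what the CCI picture rules out.
    adjacency = (err[:, :-1] & err[:, 1:]).sum() / max(err.sum(), 1)
    print(f"{name}: error rate {err.mean():.3f}, "
          f"neighbouring-error fraction {adjacency:.3f}")
```

At roughly the same overall error rate, the block model gives a far higher neighbouring-error fraction; spatially structured mistakes of this kind are exactly what the white-noise reading of CCI excludes.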
Similar references
Supervised Segmentation by Iterated Contextual Pixel Classification
We propose a general iterative contextual pixel classifier for supervised image segmentation. The iterative procedure is statistically well-founded, and can be considered a variation on the iterated conditional modes (ICM) of Besag. Having an initial segmentation, the algorithm iteratively updates it by reclassifying every pixel, based on the original features and, additionally, contextual info...
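For reference, a generic ICM-style update (a minimal sketch assuming a Potts-style smoothness term, not the specific iterated contextual pixel classifier of that paper) can be written as:

```python
# Generic ICM sketch: reclassify each pixel from its per-class log-likelihoods
# plus agreement with its 4-neighbours (assumed Potts-style prior).
import numpy as np

def icm(log_lik, beta=1.0, n_iter=5):
    """log_lik: (H, W, K) array of per-pixel class log-likelihoods."""
    H, W, K = log_lik.shape
    labels = log_lik.argmax(axis=2)  # initial, non-contextual segmentation
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                neigh = [labels[x, y]
                         for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                         if 0 <= x < H and 0 <= y < W]
                # Data term plus a smoothness reward per agreeing neighbour.
                scores = log_lik[i, j] + beta * np.array(
                    [sum(n == k for n in neigh) for k in range(K)])
                labels[i, j] = scores.argmax()
    return labels

# Toy usage: noisy two-class likelihoods on a half-and-half image.
rng = np.random.default_rng(0)
truth = np.zeros((32, 32), dtype=int)
truth[:, 16:] = 1
noisy = np.stack([-(truth - 0) ** 2, -(truth - 1) ** 2], axis=2) \
        + rng.normal(0, 0.5, (32, 32, 2))
print("pixel accuracy:", (icm(noisy) == truth).mean())
```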
Multinomial Mixture Modelling for Bilingual Text Classification
Mixture modelling of class-conditional densities is a standard pattern classification technique. In text classification, the use of class-conditional multinomial mixtures can be seen as a generalisation of the Naive Bayes text classifier relaxing its (class-conditional feature) independence assumption. In this paper, we describe and compare several extensions of the class-conditional multinomia...
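As a rough illustration of that relaxation (a sketch with made-up parameters, not the specific extensions compared in that paper), the class-conditional document likelihood is a single multinomial under Naive Bayes, and a weighted sum of multinomials under the mixture model:

```python
# Sketch: document log-likelihood under (a) a class-conditional multinomial
# (Naive Bayes) and (b) a class-conditional mixture of multinomials.
# All parameter values are made up for illustration; the shared multinomial
# coefficient is omitted since it is identical in both cases.
import numpy as np
from scipy.special import logsumexp

counts = np.array([3, 0, 2, 1])                    # word counts for one document

# (a) Naive Bayes: one multinomial per class.
theta = np.array([0.5, 0.1, 0.3, 0.1])             # p(word | class c)
loglik_nb = counts @ np.log(theta)

# (b) Mixture of K=2 multinomials for the same class.
pi = np.array([0.6, 0.4])                          # mixture weights
theta_k = np.array([[0.5, 0.1, 0.3, 0.1],
                    [0.2, 0.4, 0.2, 0.2]])         # p(word | class c, component k)
loglik_mix = logsumexp(np.log(pi) + counts @ np.log(theta_k).T)

print(loglik_nb, loglik_mix)
```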
Brain Tissue Classification of Magnetic Resonance Images Using Conditional Random Fields
In this project, I propose the application of a discriminative framework for segmentation of a T1-weighted magnetic resonance image (MRI). The use of Gaussian mixture models (GMM) is fairly ubiquitous in processing brain images for statistical analysis. This generative framework makes several assumptions that restrict its success and application. GMM assumes there is no spatial correlation when ...
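A minimal sketch of that generative baseline (three assumed tissue classes fit to synthetic intensities with scikit-learn, not the proposed discriminative CRF) makes the limitation visible: each voxel is labelled from its intensity alone, with no spatial context.

```python
# Sketch: GMM tissue classification from voxel intensities alone, the
# context-free baseline a CRF approach is meant to improve on. The synthetic
# intensities and the three-class setup are assumptions for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Fake intensities: three Gaussian clusters standing in for CSF / GM / WM.
intensities = np.concatenate([rng.normal(m, 5, 2000) for m in (30, 70, 110)])

gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities.reshape(-1, 1))
labels = gmm.predict(intensities.reshape(-1, 1))   # per-voxel class, no neighbours used
print(np.bincount(labels))
```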
bayeclass: an R package for learning Bayesian network classifiers. Applications to neuroscience
Machine learning provides tools for automated analysis of data. The most commonly used form of machine learning is supervised classification. A supervised classifier learns a mapping from the descriptive features of an object to the set of possible classes, from a set of feature-class pairs. Once learned, it is used to predict the class for novel data instances. The Bayesian network-based su...
Distribution-Free Learning of Graphical Model Structure in Continuous Domains
In this paper we present a probabilistic non-parametric conditional independence test of X and Y given a third variable Z in domains where X, Y, and Z are continuous. This test can be used for the induction of the structure of a graphical model (such as a Bayesian or Markov network) from experimental data. We also provide an effective method for calculating it from data. We show that our metho...
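As a simple point of comparison (a parametric partial-correlation test under a linear-Gaussian assumption, not the distribution-free test that paper describes), a conditional independence check for continuous variables might look like:

```python
# Sketch: a basic parametric conditional-independence check for continuous
# X, Y given Z via partial correlation with a Fisher z-transform. This is a
# simpler stand-in, not the non-parametric test described in the paper.
import numpy as np
from scipy import stats

def partial_corr_ci_test(x, y, z):
    """Return the p-value for H0: X independent of Y given Z (linear-Gaussian)."""
    # Residualise X and Y on Z, then correlate the residuals.
    Z1 = np.column_stack([np.ones_like(z), z])
    rx = x - Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]
    ry = y - Z1 @ np.linalg.lstsq(Z1, y, rcond=None)[0]
    r = np.corrcoef(rx, ry)[0, 1]
    n, k = len(x), 1                       # sample size, number of conditioning variables
    fisher_z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - k - 3)
    return 2 * stats.norm.sf(abs(fisher_z))

rng = np.random.default_rng(0)
z = rng.normal(size=500)
x, y = z + rng.normal(size=500), z + rng.normal(size=500)   # X indep. of Y given Z
print(partial_corr_ci_test(x, y, z))     # a large p-value is expected here
```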